
(CVPR 2018) Density-aware Single Image De-raining using a Multi-stream Dense Network

Zhang H, Patel V M. Density-aware single image de-raining using a multi-stream dense network[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2018: 695-704.



1. Overview


1.1. Motivation

  • Rain density is non-uniform across images
  • Existing methods do not consider the density of rain drops, which leads to over-de-raining or under-de-raining

[6]. Deep Detail Network. CVPR 2017
[33]. Deep Joint Rain Detection and Removal. CVPR 2017

This paper proposes DID-MDN:

  • joint (two stage) rain density estimation and de-raining (guided by the estimated rain-density label)
  • multi-stream (different scale feature)
  • use residual to represent rain-density feature
  • create a dataset containing rain-density label (heavy, medium, light)

1.2. Related Work

  • video-based methods
  • prior-based methods (tend to over-smooth details)
  • multi-scale features (U-Net, FCN, skip connections)

1.3. Dataset

  • 12,000 for training
  • 1,200 for testing
  • use Photoshop to synthesize three rain-density levels (noise levels [5%, 35%], [35%, 65%], [65%, 95%])
  • link




2. Architecture




2.1. Residual-Aware Rain-Density Classifier

A single network may not be sufficient to learn the different rain densities occurring in practice.

  • used to guide the de-raining process
  • the residual represents the rain-streak feature better than the rainy image itself
  • the authors observe that fine-tuning a pre-trained model is not an efficient solution

High-level features focus on localizing discriminative objects (not small rain streaks), so they are not well suited for density classification.

2.1.1. Steps

  • estimate the residual (residual feature extraction network)
  • train a classifier on the residual (classifier)
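
A minimal PyTorch sketch of this two-stage design (layer depths, channel counts, and class names here are illustrative assumptions, not the paper's exact configuration):

```python
import torch
import torch.nn as nn

class ResidualExtractor(nn.Module):
    """Estimates the rain residual (rain streaks) from a rainy image."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 3, 3, padding=1),
        )

    def forward(self, rainy):
        # residual ≈ rainy image minus clean background
        return self.body(rainy)

class DensityClassifier(nn.Module):
    """Predicts the rain-density label (light / medium / heavy) from the residual."""
    def __init__(self, num_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, num_classes)

    def forward(self, residual):
        return self.fc(self.features(residual).flatten(1))

# usage: classify the density from the estimated residual, not from the rainy image
extractor, classifier = ResidualExtractor(), DensityClassifier()
rainy = torch.randn(1, 3, 128, 128)
density_logits = classifier(extractor(rainy))
```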

2.1.2. Loss

  • first train the residual extraction network
  • then train the classifier
  • finally optimize both jointly
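
Paraphrasing the paper's formulation (the notation here is mine, not quoted verbatim), the classifier stage combines a per-pixel Euclidean loss on the estimated residual with a cross-entropy loss on the predicted density label:

$$L = L_{E,r} + L_{C}$$

where $L_{E,r}$ penalizes the difference between the estimated and ground-truth residual, and $L_{C}$ is the cross-entropy over the three density classes.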


2.1.3. Classifier (light, medium, heavy)



2.2. Multi-Stream Dense Network

  • concatenate the multi-stream features with the estimated density label
  • refinement
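
A minimal sketch of this label fusion, assuming the predicted density label is broadcast into a spatial map and concatenated with the fused stream features before a small refinement head (channel counts and the refinement depth are assumptions):

```python
import torch
import torch.nn as nn

class LabelFusionRefine(nn.Module):
    """Fuses the density label with the multi-stream features, then refines."""
    def __init__(self, feat_channels=96, num_classes=3):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(feat_channels + num_classes, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 3, 3, padding=1),  # de-rained output (or residual)
        )

    def forward(self, stream_features, density_logits):
        b, _, h, w = stream_features.shape
        # broadcast the (soft) label over the spatial dimensions
        label_map = density_logits.softmax(dim=1).view(b, -1, 1, 1).expand(-1, -1, h, w)
        return self.refine(torch.cat([stream_features, label_map], dim=1))
```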

2.3. Multi-Stream



  • smaller rain streaks can be captured by small-scale features
  • longer rain streaks can be captured by larger-scale features

For each stream:

  • six dense blocks + six transition layers
  • short paths (dense connections) help convergence



  • Dense1: 3 transition-down + 3 transition-up (7×7 kernels)
  • Dense2: 2 transition-down + 2 no-sampling + 2 transition-up (5×5 kernels)
  • Dense3: 1 transition-down + 4 no-sampling + 1 transition-up (3×3 kernels)
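
A rough single-stream building-block sketch in PyTorch, following the per-stream layout above; the growth rate, channel counts, and the simplified dense block are assumptions, not the paper's exact design:

```python
import torch
import torch.nn as nn

# Per-stream layout from the notes: kernel size and transition schedule
# ("down" halves resolution, "none" keeps it, "up" doubles it).
STREAM_CONFIG = {
    "dense1": {"kernel": 7, "transitions": ["down"] * 3 + ["up"] * 3},
    "dense2": {"kernel": 5, "transitions": ["down"] * 2 + ["none"] * 2 + ["up"] * 2},
    "dense3": {"kernel": 3, "transitions": ["down"] * 1 + ["none"] * 4 + ["up"] * 1},
}

class DenseBlock(nn.Module):
    """Simplified dense block: each layer sees the concatenation of all previous outputs."""
    def __init__(self, in_ch, growth=16, layers=4, kernel=3):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch
        for _ in range(layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(ch, growth, kernel, padding=kernel // 2),
                nn.ReLU(inplace=True),
            ))
            ch += growth
        self.out_channels = ch

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        return torch.cat(feats, dim=1)

def transition(in_ch, out_ch, mode):
    """Transition layer: 1x1 conv plus optional down/up-sampling."""
    ops = [nn.Conv2d(in_ch, out_ch, 1), nn.ReLU(inplace=True)]
    if mode == "down":
        ops.append(nn.AvgPool2d(2))
    elif mode == "up":
        ops.append(nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False))
    return nn.Sequential(*ops)
```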

2.4. Total Loss
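
Paraphrasing the paper's formulation (notation is mine, not quoted verbatim), the whole network is trained with the residual loss, a per-pixel Euclidean loss on the de-rained image, a feature-based (perceptual) loss, and the classification loss:

$$L = L_{E,r} + L_{E,d} + \lambda_F L_F + L_C$$

with $\lambda_F = 1$ in the experiments (see 4.1).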





3. Experiments


4. Dataset

  • Train1. 12,000
  • Test1. 1,200
  • Test2. 1,000 (from Deep Detail Network)

4.1. Detail

  • random crop, horizontal flip
  • batch size 1
  • λF = 1
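
A minimal sketch of the training setup implied above; the crop size and the paired-augmentation helper are assumptions, while batch size 1 and λF = 1 come from the notes:

```python
import random
import torchvision.transforms.functional as TF

def augment_pair(rainy, clean, crop=256):
    """Apply the same random crop and horizontal flip to a (rainy, clean) image pair."""
    top = random.randint(0, rainy.height - crop)
    left = random.randint(0, rainy.width - crop)
    rainy = TF.crop(rainy, top, left, crop, crop)
    clean = TF.crop(clean, top, left, crop, crop)
    if random.random() < 0.5:  # horizontal flip with the same decision for both
        rainy, clean = TF.hflip(rainy), TF.hflip(clean)
    return TF.to_tensor(rainy), TF.to_tensor(clean)

BATCH_SIZE = 1
LAMBDA_F = 1.0  # weight of the feature-based loss
```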

4.2. Ablation Study

  • VGG-based classifier vs. the residual-aware classifier



  • Modules



  1. Single: single stream without label fusion
  2. Yang-Multi: multi-stream with dilated convolutions (structure from [33])
  3. Multi-no-label: multi-stream without label fusion
  4. DID-MDN: multi-stream with label fusion


  • Single and Yang-Multi over-de-rain and blur details
  • Multi-no-label leaves some rain streaks

4.3. Comparison



  • Synthetic


    [33, 41] leave some rain streaks.
    [6] removes some details.
  • Real



  1. long, thin rain streaks
  2. heavy rain
  3. small, round rain drops
  4. medium rain

4.4. Inference Time